
feat(event cache): automatically shrink a room's linked chunk when all subscribers are gone #4703

Merged: 1 commit into main, Feb 24, 2025

Conversation

bnjbvr (Member) commented Feb 20, 2025

On top of #4694.

Every subscriber to a RoomEventCache now shares an atomic counter of the number of listeners to updates for that room. When the count reaches 0, the last dropped listener notifies a task (living at the EventCache level, to avoid having one task per room) that the room's linked chunk may be shrunk, if the number of listeners is still 0 at that point.

Part of #3280.
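A minimal sketch of the counting scheme described above, using plain std types and hypothetical names (the real types live in matrix-sdk's event cache and are async):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::{mpsc, Arc};

// Hypothetical stand-in for the per-room event cache state.
struct RoomState {
    listener_count: Arc<AtomicUsize>,
    // Channel to the single cache-level task that performs the shrink.
    shrink_tx: mpsc::Sender<String>,
    room_id: String,
}

// Handle returned to each subscriber; dropping it decrements the count.
struct Subscription {
    listener_count: Arc<AtomicUsize>,
    shrink_tx: mpsc::Sender<String>,
    room_id: String,
}

impl RoomState {
    fn subscribe(&self) -> Subscription {
        self.listener_count.fetch_add(1, Ordering::SeqCst);
        Subscription {
            listener_count: Arc::clone(&self.listener_count),
            shrink_tx: self.shrink_tx.clone(),
            room_id: self.room_id.clone(),
        }
    }
}

impl Drop for Subscription {
    fn drop(&mut self) {
        // If we were the last listener, ask the cache-level task to
        // consider shrinking this room's linked chunk.
        if self.listener_count.fetch_sub(1, Ordering::SeqCst) == 1 {
            let _ = self.shrink_tx.send(self.room_id.clone());
        }
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let state = RoomState {
        listener_count: Arc::new(AtomicUsize::new(0)),
        shrink_tx: tx,
        room_id: "!room:example.org".to_owned(),
    };
    let s1 = state.subscribe();
    let s2 = state.subscribe();
    drop(s1);
    // One listener remains, so no shrink request was sent.
    assert!(rx.try_recv().is_err());
    drop(s2);
    // Last listener gone: the shrink request is emitted.
    assert_eq!(rx.try_recv().unwrap(), "!room:example.org");
}
```

Note the counter only triggers a request; the receiving task re-checks the count before actually shrinking, since a new subscriber may arrive in between.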

@bnjbvr bnjbvr requested a review from a team as a code owner February 20, 2025 16:13
@bnjbvr bnjbvr requested review from Hywan and removed request for a team February 20, 2025 16:13
codecov bot commented Feb 20, 2025

Codecov Report

Attention: Patch coverage is 82.35294% with 9 lines in your changes missing coverage. Please review.

Project coverage is 85.94%. Comparing base (f3f37a3) to head (c78fc0f).
Report is 14 commits behind head on main.

Files with missing lines                        Patch %   Missing lines
crates/matrix-sdk/src/event_cache/mod.rs        83.33%    4 ⚠️
crates/matrix-sdk/src/event_cache/room/mod.rs   84.61%    4 ⚠️
crates/matrix-sdk-ui/src/timeline/builder.rs     0.00%    1 ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #4703      +/-   ##
==========================================
- Coverage   85.94%   85.94%   -0.01%     
==========================================
  Files         290      290              
  Lines       33892    33942      +50     
==========================================
+ Hits        29130    29170      +40     
- Misses       4762     4772      +10     


Comment on lines 78 to 82
// I hear you from the future: "but, spawning a detached task in a drop
// implementation is real bad! Maybe there will be multiple shrinks
// happening at the same time, and that's bad!". However, this can't
// happen, because the whole `state` variable is guarded by a fair lock,
// which will run queries in the order they happen. Should be fine™.
Member:

You forgot to explain why you need to spawn here. It's because drop cannot be async, and you're doing async operations.

Author (bnjbvr):

Yep, I introduced a mechanism with one (1) task listening forever in the event cache, which should be more sound.
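The pattern the author lands on can be sketched with plain std threads and channels (hypothetical names; the real code uses a single async task in the event cache). Since `Drop` cannot be async, the last listener only sends a message; all actual shrinks run on one long-lived receiver, which serializes them by construction:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::{mpsc, Arc};
use std::thread;

// Hypothetical sketch of the "one task listening forever" fix. `shrunk` is
// a stand-in for the real shrink operation, so the effect is observable.
fn spawn_shrinker(
    requests: mpsc::Receiver<String>,
    listener_count: Arc<AtomicUsize>,
    shrunk: mpsc::Sender<String>,
) -> thread::JoinHandle<()> {
    thread::spawn(move || {
        for room_id in requests {
            // Re-check the count: a new subscriber may have appeared
            // between the notification and now, in which case the
            // shrink must be skipped.
            if listener_count.load(Ordering::SeqCst) == 0 {
                let _ = shrunk.send(room_id);
            }
        }
    })
}

fn main() {
    let (req_tx, req_rx) = mpsc::channel();
    let (shrunk_tx, shrunk_rx) = mpsc::channel();
    let count = Arc::new(AtomicUsize::new(0));
    let handle = spawn_shrinker(req_rx, Arc::clone(&count), shrunk_tx);

    // No listeners: the shrink goes through.
    req_tx.send("!a:example.org".to_owned()).unwrap();
    assert_eq!(shrunk_rx.recv().unwrap(), "!a:example.org");

    // A listener re-subscribed before the request is handled: skipped.
    count.store(1, Ordering::SeqCst);
    req_tx.send("!b:example.org".to_owned()).unwrap();

    drop(req_tx); // close the channel so the task exits
    handle.join().unwrap();
    assert!(shrunk_rx.try_recv().is_err());
}
```

Because every shrink flows through the single receiver loop, two shrinks can never run concurrently, which addresses the concern about spawning detached tasks in `Drop`.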

@bnjbvr force-pushed the bnjbvr/unload-after-all-subscriptions-gone branch 2 times, most recently from 291665b to dcf80a6 on February 24, 2025 15:37
@bnjbvr bnjbvr requested a review from Hywan February 24, 2025 15:42
@Hywan (Member) left a comment:

Excellent :-]. Thank you!

@bnjbvr force-pushed the bnjbvr/unload-after-all-subscriptions-gone branch 2 times, most recently from 4674ba3 to c78fc0f on February 24, 2025 16:25
@bnjbvr bnjbvr enabled auto-merge (rebase) February 24, 2025 16:26
@bnjbvr bnjbvr merged commit 5dd5710 into main Feb 24, 2025
42 checks passed
@bnjbvr bnjbvr deleted the bnjbvr/unload-after-all-subscriptions-gone branch February 24, 2025 16:40